On the capabilities of multilayer perceptrons
Authors
Abstract
Similar Papers
On Langevin Updating in Multilayer Perceptrons
The Langevin updating rule, in which noise is added to the weights during learning, is presented and shown to improve learning on problems with initially ill-conditioned Hessians. This is particularly important for multilayer perceptrons with many hidden layers, which often have ill-conditioned Hessians. In addition, Manhattan updating is shown to have a similar effect.
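A minimal sketch of the two update rules named in this abstract, assuming plain gradient descent as the baseline; the function names, learning rate, and noise level are illustrative choices, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def langevin_update(weights, grad, lr=0.01, noise_std=0.01):
    # Langevin updating: an ordinary gradient step plus zero-mean
    # Gaussian noise injected into the weights; in practice the noise
    # level would typically be annealed toward zero during training.
    noise = rng.normal(0.0, noise_std, size=weights.shape)
    return weights - lr * grad + noise

def manhattan_update(weights, grad, lr=0.01):
    # Manhattan updating: move each weight by a fixed step in the
    # direction of the gradient's sign, discarding its magnitude.
    return weights - lr * np.sign(grad)
```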
Fast training of multilayer perceptrons
Training a multilayer perceptron by an error backpropagation algorithm is slow and uncertain. This paper describes a new approach which is much faster and more certain than error backpropagation. The proposed approach is based on combined iterative and direct solution methods. In this approach, we use an inverse transformation for linearization of nonlinear output activation functions, direct soluti...
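The truncated abstract only names the ingredients, but the linearization step it describes can be sketched: map the targets through the inverse of the output activation, which turns fitting the output layer into a linear least-squares problem with a direct (non-iterative) solution. A hypothetical illustration, assuming logistic outputs; the function name and clipping constant are my own:

```python
import numpy as np

def fit_output_layer(H, T, eps=1e-6):
    # H: (n_samples, n_hidden) hidden-layer activations
    # T: (n_samples, n_outputs) targets in (0, 1)
    # Linearize the logistic output activation by applying its inverse
    # (the logit) to the targets, then solve the resulting linear
    # system directly instead of iterating with backpropagation.
    T = np.clip(T, eps, 1.0 - eps)      # keep the logit finite
    Z = np.log(T / (1.0 - T))           # inverse of the logistic sigmoid
    W, *_ = np.linalg.lstsq(H, Z, rcond=None)
    return W
```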
Alternate Learning Algorithm on Multilayer Perceptrons
Multilayer perceptrons have been applied successfully to solve some difficult and diverse problems with the backpropagation learning algorithm. However, the algorithm is known to suffer from slow and false convergence arising from flat surfaces and local minima of the cost function. Many algorithms announced so far to accelerate convergence speed and avoid local minima appear to pay some trade-off for ...
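The "flat surface" problem mentioned here can be made concrete in a few lines: wherever a sigmoid unit saturates, its derivative, and hence the backpropagated gradient, is nearly zero, so plain gradient descent barely moves. A small numeric illustration:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x)) collapses toward zero on
# the saturated (flat) parts of the activation, starving the
# backpropagated error signal and stalling learning.
for x in (0.0, 2.0, 6.0, 12.0):
    s = sigmoid(x)
    print(f"x = {x:5.1f}   sigmoid'(x) = {s * (1.0 - s):.6f}")
```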
Multilayer Perceptrons based on Fuzzy Flip-Flops
The concept of the fuzzy flip-flop was introduced in the mid-1980s by Hirota (with his students). The Hirota Lab recognized the essential importance of the concept of a fuzzy extension of a sequential circuit and the notion of fuzzy memory. From this point of view they proposed alternatives for “fuzzifying” digital flip-flops. The starting elementary digital units were the binary J-K flipflo...
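As one concrete reading of "fuzzifying" a J-K flip-flop, the binary characteristic equation Q(t+1) = (J AND NOT Q) OR (NOT K AND Q) can be extended by replacing AND, OR, and NOT with a t-norm, an s-norm, and the complement 1 - x. The min/max choice below is just one of the alternatives such a construction admits, not necessarily the lab's exact definition:

```python
def fuzzy_jk_flipflop(J, K, Q, t_norm=min, s_norm=max):
    # Fuzzy extension of Q(t+1) = (J AND NOT Q) OR (NOT K AND Q).
    # J, K, Q are membership values in [0, 1]; with crisp inputs in
    # {0, 1} and min/max norms this reduces to the binary flip-flop.
    set_term = t_norm(J, 1.0 - Q)    # "set" branch:  J AND NOT Q
    hold_term = t_norm(1.0 - K, Q)   # "hold" branch: NOT K AND Q
    return s_norm(set_term, hold_term)

# A crisp check (J=1, K=0 sets the output) and a fuzzy state update:
print(fuzzy_jk_flipflop(1.0, 0.0, 0.0))  # -> 1.0
print(fuzzy_jk_flipflop(0.8, 0.3, 0.5))  # -> 0.5
```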
Quantile regression with multilayer perceptrons
We consider nonlinear quantile regression involving multilayer perceptrons (MLPs). In this paper we investigate the asymptotic behavior of quantile regression in a general framework: first by allowing possibly non-identifiable regression models, such as MLPs with redundant hidden units, then by relaxing the conditions on the density of the noise. We present a universal bound for the...
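The abstract does not state the loss function, but nonlinear quantile regression with an MLP is standardly trained by minimizing the "pinball" loss at level tau; a minimal sketch, with names of my choosing:

```python
import numpy as np

def pinball_loss(y_true, y_pred, tau):
    # Quantile ("pinball") loss at level tau in (0, 1): residuals where
    # the prediction undershoots are weighted by tau, overshoots by
    # (1 - tau), so the minimizer is the tau-th conditional quantile.
    residual = y_true - y_pred
    return np.mean(np.maximum(tau * residual, (tau - 1.0) * residual))

# Example: at tau = 0.9, under-prediction costs 9x more than
# over-prediction of the same size.
print(pinball_loss(np.array([1.0]), np.array([0.0]), 0.9))  # 0.9
print(pinball_loss(np.array([0.0]), np.array([1.0]), 0.9))  # ~0.1
```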
Journal
Journal title: Journal of Complexity
Year: 1988
ISSN: 0885-064X
DOI: 10.1016/0885-064X(88)90020-9